Through the chapters of this book, we’ve explored how aspects of our biology – of our innate humanness – have had a defining influence on history.
We saw how our psychological software developed for social living and altruism and how widespread cooperation enabled us to undertake the enormous, coordinated venture that is civilisation. We saw how our unique reproductive behaviour gave rise to the human family, and how dynasties in different cultures addressed the problem of ensuring an heir. We discussed our susceptibility to infections and the ramifications of endemic disease and raging pandemics. We explored the power of demography, the large-scale properties of our populations and the consequences of our proclivity for exploiting psychoactive substances to alter our conscious experiences. We saw specific examples of the historical consequences of defective genes in our DNA. Finally, we looked at the multitude of cognitive glitches and biases that affect our behaviour.
Human history has played out in the balance between our faculties and our flaws as a species.
But we’ve not been powerless slaves to our biology. Human technological progress is the story of how we have endeavoured to enhance and augment our natural capabilities and compensate for or overcome many of our biological inadequacies.
Lacking the sharp claws or sabre-like teeth of other animals, we used stone tools such as the hand axe or pointed spear to hunt and butcher our prey, enabling us to enrich our diet with meat. The same weapons allowed us to protect ourselves against predators, and we also used them to fight each other.
The exploitation of fire for cooking, in combination with the development of pottery and clay vessels, allowed us to deactivate toxins and preserve and store foodstuffs. It also provided an external pre-digestion system to extract more nutrition from food. Likewise, the quern and then the millstone, grinding and pounding grains into flour, served as a technological extension of our molar teeth.
From the sewing of animal hides into clothing to the innovation of the loom for weaving textiles, we found insulation for our furless bodies and protection from the elements as we left tropical Africa and dispersed around the world to colder climes.
These developments are all aspects of human culture – learned behaviours and practices, passed from individual to individual and down the generations. Indeed, our capacity for cultural evolution is an immensely powerful force that has enabled humanity to surpass many of the constraints of our nature.
As we’ve explored in depth throughout this book, intrinsic factors of our biology have had a profound effect on the history of our societies and civilisations. But the opposite has also been true: humanity’s cultural innovations have left imprints on our genetic make-up. For example, with the domestication of the goat, the sheep and, in particular, the cow in the last 10,000 years, populations across Europe, the Middle East and parts of Africa and Asia began supplementing their diets with milk. Being mammals, we are nourished as infants by our mother’s milk, but after weaning we would naturally stop producing lactase, the key enzyme needed to digest milk. But among the modern descendants of those ancient populations that took up dairying, the gene for lactase has adapted to remain switched on through adulthood.1 We have evolved to be better biologically suited to our own cultural environment. Today, 95 per cent of northern Europeans exhibit ‘lactase persistence’ – therefore being able to pour milk liberally on their breakfast cereal or into their tea – whereas other populations around the world become sick if they try to drink milk as adults.2 So, not only have we been using cultural inventions to enhance our biological abilities, but these innovations have in turn changed our biology.
The pace of cultural change has accelerated enormously since the dawn of civilisation. We have developed ever more sophisticated technologies. Metalworking offered tools more versatile than those made of stone, as well as more durable weapons and protective armour for our soft flesh and fragile skulls. The invention of writing enabled us to vastly expand the information we could store beyond the memory capacity of our own brains and our oral traditions, and to communicate ideas across space and time to people we would never meet. Inscriptions on clay tablets, papyrus or parchment gave way to paper and the printing press, ultimately leading to the internet, where we can access effectively limitless information from the palm of our hand. We invented spectacles to correct blurred eyesight, as well as the telescope and microscope to extend our vision into otherwise invisible realms. Modern medicines such as antibiotics, vaccines and prophylactics support our immune system and protect us from disease; other pharmaceuticals mask the effects of genetic defects, while skilful surgery fixes the consequences of anatomical deformities or injuries. We’ve also been able to take control of our own reproduction: the use of condoms, hormonal birth-control pills and other contraceptives, as well as medically safe abortions, decouples sex from procreation and provides the choice over when and with whom we want to have children, thus allowing us to regulate family size and population growth. Modern technology can also aid reproduction when we encounter difficulties; in vitro fertilisation, for example, offers hope to otherwise infertile couples.
Through all these innovations, and many more, we’ve come to possess the power to complement our natural abilities and compensate for our limitations. So much so that today, in the developed world at least, the differential survival or reproductive success of individuals rarely depends on their genetics. Natural selection no longer has this raw material on which to operate, and the evolution of our species has effectively ceased.3
Most of us now live in an environment almost entirely created and controlled by ourselves. That is not to say, however, that we are no longer subject to the prescriptions of our biology.
Let’s look at just a couple of examples. In our modern urban society, fewer and fewer people work in fields and factories. Many of us, whether we work in finance or in call centres, spend long hours of the day sitting down, hunched over a desk. After countless generations of roaming hunter-gatherers, agricultural labourers and industrial workers, we are now almost entirely sedentary.
Indeed, we now spend so much of our working day sitting down – or even worse, so much of our time off slouched on the sofa – that the key postural muscles that support our spine and hold us upright when we stand have atrophied. The vast majority of us will suffer from chronic back pain, specifically in the lower back, at some point in our lives. Moreover, near-universal access to cars and public transport in the developed world strips us of the need for physical exertion in our daily comings and goings.
Foragers and farmers spend their lives active on their feet, but with the rise of the modern industrialised world an odd concept emerged – taking exercise. As day-to-day life has become so bereft of exertion, we have purposefully inserted it back into our routine. For the ancient Greeks in their gymnasiums (derived from gymnos, the Greek word for ‘naked’), exercising was a pastime for the privileged social classes, whose slaves spared them the need to labour. Today, we rush to the gym before or after work to stay fit. We can only guess what medieval peasants would have made of the apparent absurdity of paying to walk or run on the spot, pounding out virtual miles on the treadmill (used as a form of punishment by hard labour in early-nineteenth-century British prisons, incidentally).
The shift to a largely indoor existence has also affected our vision. Eyesight deteriorates with age as the lens of the eye becomes less flexible, making it harder to focus on nearby objects, so people tend to become long-sighted as they get older. But since the Victorian age there has been a phenomenal surge in the opposite problem – myopia, or short-sightedness – even among children. Our lives have become so dominated by looking at things close-up – especially at screens – and so lacking in time spent outside scanning the distance, that myopia now affects up to 50 per cent of those living in urban environments.4
Modern life in the developed world has also led to a proliferation of allergic reactions, such as asthma, eczema, food allergies and hay fever (which is perhaps ironic considering that we evolved as a grassland species). These conditions are caused by the inflammation of soft tissues in our body and represent the immune system overreacting to innocuous triggers. While hygiene is key for reducing transmissible diseases, we have become overly cautious in keeping our homes fastidiously clean and stopping infants playing outside in the dirt. Yet the immune system needs to train itself to distinguish between genuine threats and harmless stimuli, and without early exposure to dust, bacteria and parasites, it doesn’t learn properly and so becomes hypersensitive and prone to allergic reactions.
What’s more, the leading causes of death in the developed world are no longer pestilence or famine but largely preventable conditions that we bring upon ourselves: obesity, diabetes, high blood pressure, heart disease. The problem behind these ‘lifestyle diseases’ is partly our effortless, immobile existence, but also the glut of fabulously rich foods we eat. We are facing epidemics not of infectious diseases but of the avoidable consequences of overconsumption and physical inactivity. Industrialised agriculture, with enormously efficient mechanisation and artificial fertilisers, pesticides and herbicides, provides an abundant harvest, and the mass-production of meat means it has never been cheaper or more widely available. We live in a time of overwhelming bounty. But the issue is not just the amount of food at our disposal but the kind of meals we often choose to eat. On the whole, we don’t tend to consume unhealthy amounts of fresh fruit and veg. And the root cause of our overindulgence comes down to programming deep in our biological make-up.
For our ancestors in the African savannah, survival required carefully directed effort, and evolution therefore programmed our sense of taste to favour sources of vital nutrients and minerals that were scarce in this environment, such as sugar, fat and salt. But because human evolution cannot now keep up with cultural change, we have kept our palaeolithic palate and still crave the high-value foods which are today available in abundance.
Seen in this light, the totem of modern fast food – a cheeseburger with fries and a soft drink – is like an ancestral dream come true: greasy protein, encased in energy-rich carbs, sprinkled with moreish salt and washed down with concentrated sugar solution. It is almost uncannily well-crafted to hit each and every one of our primal dietary drives and light up the pleasure centres of our brain. These food types activate the dopamine reward pathway in the brain in just the same way as the other addictive substances we explored earlier.5 Eating such rich food feels not only satisfying in the short term, but compulsively so. Indeed, almost all the modern processed food we consume is loaded with fat, salt and sugar. And much of the meat we eat has been industrially ground into mince – we’ve even outsourced the effort of chewing to a factory.
In terms of providing calories for the human body, the energy-packed, soft, easily digestible nutrients of modern meals are effectively rocket fuel supplied to people who are barely mobile. There is a sharp disconnect between the ancestral environment in which our genes evolved and the modern world we have created for ourselves. So when we allow our ancestral urges to determine our diet, we become prone to so-called mismatch diseases. Surplus energy is stored by the body in fat reserves, causing obesity; excessive salt drives high blood pressure, contributing to heart disease; and the spikes in blood sugar levels result in diabetes.
Another major reason why we find it so hard to resist eating too much processed food and sweet treats, despite knowing full well they are unhealthy and fattening, comes down to a cognitive bias. We fail to act rationally by overvaluing immediate rewards while disregarding the longer-term consequences of our choices. This tendency makes evolutionary sense. In an uncertain or dangerous world, it pays to seize benefits immediately as you may not get the chance later, or to focus on a current threat rather than something that lies over the horizon. And in the modern world, the ‘present bias’ – a cognitive short-sightedness – rears its head not only in unwholesome eating habits, but also in opting to spend spare cash today rather than building up savings for the future, or in procrastinating by seeking instant gratification now and putting off chores or work until later (if ever!). It is also one of several cognitive biases that hinders us in responding effectively to serious but gradually developing problems such as climate change.
The fact that the Earth’s climate is warming due to human activity is scientifically very well established, and if we don’t act quickly and decisively, the consequences could be dire indeed. Effective solutions to the problem require not only each of us to alter our individual behaviour to reduce greenhouse gas emissions, but also governments and industries to instigate top-down changes to policy and practices (themselves responding to what we as voters and consumers indicate we want). And I think it’s fair to say that most of us are now well aware of the changes we should be making to our current lifestyles. The problem is, we need to sacrifice immediate benefits – such as driving a big, comfortable car, flying away on summer holidays or enjoying meat or dairy products – for the sake of preserving the environment for the long-term future. (Perhaps most alarming about climate change is just how apparent its effects have already become in the past few years.) Even when we would personally gain from lifetime savings on fuel bills, the present bias deters us from buying energy-efficient appliances because of the higher upfront cost.
The sunk cost fallacy, which we explored in Chapter 8, also plays into the problem. The more time, energy or resources we’ve already committed to a particular approach, the more likely we are to stick with it even after it has become clear it would be beneficial to change tack. This cognitive bias is part of the reason our infrastructure continues its reliance on fossil fuels despite mounting evidence of the advantages of renewable or carbon-neutral alternatives.
And this is assuming, of course, that you accept the severity of the issue: a sizeable fraction of the public still does not believe in climate change. Most of us get our news from mass media outlets, which have become increasingly polarised in their ideological stances. Confirmation bias only serves to strengthen the conviction of those who doubt the severity of the situation.fn1 We have seemingly been hardwired with a number of cognitive biases that impede our ability to take appropriate action against distant, gradual and complex challenges such as climate change.
Cognitive biases, along with many other aspects of our biology and evolutionary past, have had a huge influence on the course of human history. And they still hold a powerful influence on the future we will create.7